AIbase
# English fine-tuned base model

## Qwen2 96M

License: Apache-2.0
Author: Felladrin
Tags: Large Language Model, English

Qwen2-96M is a compact language model built on the Qwen2 architecture. With 96 million parameters and support for a context length of 8,192 tokens, it is suited to English text generation tasks.
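For a rough sense of scale, the stated 96-million-parameter count implies the following approximate weight storage at common numeric precisions (a back-of-the-envelope sketch only; actual memory use also depends on the runtime, activations, and KV cache):

```python
# Approximate weight-storage footprint of a 96M-parameter model.
# Weights only; activations and KV cache add further overhead.
PARAMS = 96_000_000  # parameter count stated for Qwen2-96M


def weights_mb(params: int, bytes_per_param: int) -> float:
    """Return approximate weight storage in megabytes."""
    return params * bytes_per_param / 1e6


print(f"fp32: {weights_mb(PARAMS, 4):.0f} MB")  # 384 MB
print(f"fp16: {weights_mb(PARAMS, 2):.0f} MB")  # 192 MB
print(f"int8: {weights_mb(PARAMS, 1):.0f} MB")  #  96 MB
```

At half precision the weights fit in well under 200 MB, which is consistent with this model's positioning as a miniature member of the Qwen2 family.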
© 2025 AIbase